
    The strategy and realization of enterprise integration

    University of Technology, Sydney. Faculty of Information Technology.
    In recent years we have experienced exponential growth in business innovation, emerging technology, and integration complexity. With this unprecedented growth, the priority of enterprise integration has shifted from patching solutions to the governance of agility. Enterprise integration mainly deals with interoperability between the virtual and physical worlds, which is thorny by its very nature. In order to cope with rising complexity, coherence between business, service, and physical components is crucial. Instead of consolidation from fragmentation, an iterative approach is taken in driving concept and strategy into realization. The empirical statistics indicate that an anatomy of ontological research is essential for producing an overview of interoperability. The author's numerous research projects demonstrate a number of factors critical to generating higher productivity and lower risk. These factors include higher visibility of atomic elements, well-specified services, and precise architectural alignment. By carrying these success factors into realization, this thesis proposes enterprise vertical integration, employing a three-step strategy of componentization, transformation, and virtualization. Componentization derives an ontology of atomic elements for the service-based foundation. In transformation, service components are produced from these raw elements, using a multi-discipline, three-dimensional approach to achieve component synthesis. The final step, virtualization, is the objective of enterprise integration: it establishes the enterprise skeleton and achieves a common-service mainstream in the industry. Experiential evidence indicates that this higher-level, three-step approach works effectively in minimizing risk and increasing productivity, with particular benefit for projects of higher complexity and larger scale. Given the incessant business change inherent in our chaotic new age of computing, the three-step approach relies on a new framework to streamline realization and cope with project complexity. A Method, Evaluation, Techniques, and Application (META) framework addresses the interference between the virtual and physical layers. In the initial process it develops component validation, analysis processes, and synthesis techniques for service transformation; it then develops service components and common services for service virtualization. This thesis proposes a four-pillared approach to support the META framework, and introduces sub-area concepts such as "pattern" and "state" to enhance the capability of the framework before moving it into the industry mainstream. This thesis distinguishes itself from existing literature in that very few studies in this field address real enterprise-scale integration, and none of the reviewed literature addresses fundamental enterprise issues such as ontological research or high-level strategy as proposed by this thesis.
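
    The three-step strategy reads most naturally as a pipeline. The sketch below is a hypothetical illustration only (the thesis publishes no code): names such as AtomicElement, componentize, transform, and virtualize are invented here to show how componentization derives atomic elements, transformation groups them into service components, and virtualization exposes them as a common-service catalogue.

```python
from dataclasses import dataclass, field

# Hypothetical types and function names; a sketch of the thesis's
# componentization -> transformation -> virtualization flow, not its method.

@dataclass
class AtomicElement:
    name: str
    domain: str  # e.g. "business", "service", or "physical"

@dataclass
class ServiceComponent:
    name: str
    elements: list = field(default_factory=list)

def componentize(raw_assets):
    """Step 1: derive an ontology of atomic elements from raw enterprise assets."""
    return [AtomicElement(name=a["name"], domain=a["domain"]) for a in raw_assets]

def transform(elements):
    """Step 2: synthesise service components from atomic elements (grouped by domain here)."""
    grouped = {}
    for e in elements:
        grouped.setdefault(e.domain, ServiceComponent(name=f"{e.domain}-service")).elements.append(e)
    return list(grouped.values())

def virtualize(components):
    """Step 3: expose the components as a common-service catalogue (the 'enterprise skeleton')."""
    return {c.name: [e.name for e in c.elements] for c in components}

if __name__ == "__main__":
    assets = [{"name": "invoice", "domain": "business"},
              {"name": "payment-api", "domain": "service"},
              {"name": "warehouse-sensor", "domain": "physical"}]
    print(virtualize(transform(componentize(assets))))
```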

    Progress in compilation of the 1:2,000,000-scale topographic map

    The application of special photogrammetric techniques has enabled the systematic mapping of Mars' topography at a scale of 1:2,000,000, using high-altitude Viking Orbiter pictures. In fiscal 86, compilation was completed of the 24 subquadrangles that make up the quadrangles MC-12, MC-13, MC-14, MC-15, MC-20, and MC-21. This work completes compilation of the 60 topographic maps covering the equatorial belt (lat. ±30°). The remaining 80 subquadrangles of Mars are planned to be completed within 3 years (27, 27, and 26 subquadrangles in fiscal 87, 88, and 89, respectively). Elevations on all topographic maps are relative to the Mars topographic datum. The maps have a contour interval of 1 km and a precision of ±1 km. The equatorial-belt maps are Mercator projections having true scale at lat. ±27.476°. These maps provide more precise information than those previously available, and they will help in understanding the geologic processes that have shaped the Martian surface.
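
    For context on the quoted true-scale latitude, the point scale of a secant Mercator projection on a sphere follows the standard formula k(lat) = cos(lat_ts) / cos(lat), where lat_ts is the latitude of true scale. The snippet below is a minimal sketch under a spherical assumption (the actual maps are referenced to the Mars topographic datum); the helper name mercator_scale_factor is invented for illustration.

```python
import math

def mercator_scale_factor(lat_deg, true_scale_lat_deg=27.476):
    """Point scale factor of a spherical Mercator projection whose
    standard parallels lie at +/- true_scale_lat_deg (hypothetical helper)."""
    return math.cos(math.radians(true_scale_lat_deg)) / math.cos(math.radians(lat_deg))

# Scale is exact on the standard parallels and varies across the +/-30 deg belt:
for lat in (0.0, 27.476, 30.0):
    print(f"lat {lat:6.3f} deg -> scale factor {mercator_scale_factor(lat):.4f}")
```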

    Using Information Filtering in Web Data Mining Process

    Web service-oriented Grid is becoming a standard for achieving loosely coupled distributed computing, and Grid services can easily be specified with web-service based interfaces. In this paper we first envisage a realistic Grid market with players such as end-users, brokers, and service providers participating co-operatively with an aim to meet requirements and earn profit. End-users wish to use the functionality of Grid services by paying the minimum possible price, or a price confined within a specified budget; brokers aim to maximise profit whilst establishing an SLA (Service Level Agreement), satisfying end-user needs, and at the same time resisting the volatility of service execution time and availability; service providers aim to develop price models based on end-user or broker demands that will maximise their profit. In this paper we focus on developing stochastic approaches to end-user workflow scheduling that provide QoS guarantees by establishing an SLA. We also develop a novel 2-stage stochastic programming technique that aims at establishing an SLA with end-users regarding satisfying their workflow QoS requirements. We develop a scheduling (workload allocation) technique based on linear programming that embeds the negotiated workflow QoS into the program and models Grid services as generalised queues. This technique is shown to outperform existing scheduling techniques that do not rely on real-time performance information.
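
    As a rough illustration of the workload-allocation idea, the sketch below poses a toy linear program: choose the fraction of a workflow routed to each Grid service so that expected cost is minimised while the negotiated QoS (here, a mean completion-time bound from the SLA) is respected. All cost, service-time, and deadline figures are invented, and the paper's generalised-queue model and two-stage stochastic program are not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical per-service figures; the paper's actual queueing model is richer.
cost_per_job = np.array([4.0, 2.5, 3.0])   # price charged by each Grid service
time_per_job = np.array([1.0, 3.0, 2.0])   # expected service time per unit of load
deadline = 2.0                             # negotiated QoS: mean completion-time bound

# Decision variables x_i = fraction of the workload sent to service i.
#   minimise  cost . x
#   s.t.      time . x <= deadline   (QoS constraint from the SLA)
#             sum(x) == 1            (all work is allocated)
#             x >= 0
res = linprog(c=cost_per_job,
              A_ub=[time_per_job], b_ub=[deadline],
              A_eq=[np.ones(3)], b_eq=[1.0],
              bounds=[(0, None)] * 3)

print("allocation:", np.round(res.x, 3), "expected cost:", round(res.fun, 3))
```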

    Generalized gauge transformations: pure Yang-Mills case
